Why is topology hard to learn?
Oriekhov, D. O., Bergkamp, Stan, Jin, Guliuxin, Luna, Juan Daniel Torres, Zouggari, Badr, van der Meer, Sibren, Yazidi, Naoual El, Greplova, Eliska
Phase classification has become a prototypical benchmark for data-driven analysis of condensed matter physics. The type and complexity of the phase transition dictate the level of complexity of the algorithm one has to employ. This topic has been broadly explored, offering a menu of both supervised and unsupervised techniques ranging from simple clustering [1-3] to more complex machine learning methods [4-7]. The phase classification problem is most commonly posed as follows: a model with no prior knowledge of the underlying physics is shown a dataset that is both relevant to and straightforwardly obtainable in the scenario we wish to study.
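The simple-clustering end of this menu can be sketched in a few lines. The setup below is purely illustrative and not from the paper: each synthetic "configuration" is summarized by its mean magnetization, drawn from an ordered phase (near 0.9) and a disordered phase (near 0), and a hand-rolled two-means clustering separates the phases with no physics input.

```python
import numpy as np

# Hypothetical data: mean magnetizations of configurations sampled from an
# ordered phase (|m| near 1) and a disordered phase (m near 0).
rng = np.random.default_rng(1)
ordered = rng.normal(0.9, 0.05, size=100)
disordered = rng.normal(0.0, 0.05, size=100)
samples = np.concatenate([ordered, disordered])

# Two-means clustering on the scalar feature: no knowledge of physics is used.
centers = np.array([samples.min(), samples.max()])
for _ in range(20):
    labels = np.abs(samples[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([samples[labels == c].mean() for c in (0, 1)])

# The recovered cluster centers approximate the two phases' magnetizations.
```

The two converged centers land near 0.0 and 0.9, i.e., the unsupervised step rediscovers the order parameter's two regimes from the data alone.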
Noncommutative Model Selection and the Data-Driven Estimation of Real Cohomology Groups
Guzmán-Tristán, Araceli, Rieser, Antonio, Velázquez-Richards, Eduardo
We propose three completely data-driven methods for estimating the real cohomology groups $H^k (X ; \mathbb{R})$ of a compact metric-measure space $(X, d_X, \mu_X)$ embedded in a metric-measure space $(Y,d_Y,\mu_Y)$, given a finite set of points $S$ sampled from a uniform distribution $\mu_X$ on $X$, possibly corrupted with noise from $Y$. We present the results of several computational experiments in the case that $X$ is embedded in $\mathbb{R}^n$, where two of the three algorithms performed well.
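As a toy illustration of the target quantity (not of the paper's methods, which are not reproduced here): the rank of $H^0(X;\mathbb{R})$ is the number of connected components of $X$, and it can be estimated from a finite sample by building a neighborhood graph at a user-chosen scale $\varepsilon$ and counting components with union-find. The two-circle sample and the scale are illustrative choices.

```python
import numpy as np

def betti0(points, eps):
    """Estimate the rank of H^0 (number of connected components) of the
    support of a sample, via union-find on the eps-neighborhood graph."""
    n = len(points)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= eps:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(n)})

# Two well-separated unit circles, 50 sample points each: beta_0 = 2.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
sample = np.concatenate([circle, circle + np.array([5.0, 0.0])])
print(betti0(sample, eps=0.5))  # -> 2
```

At a scale large enough to bridge the gap between the circles the estimate drops to one component, which is the usual scale-selection difficulty that data-driven estimators must address.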
Topological Tensor Eigenvalue Theorems in Data Fusion
This paper introduces a novel framework for tensor eigenvalue analysis in the context of multi-modal data fusion, leveraging topological invariants such as Betti numbers. While traditional approaches to tensor eigenvalues rely on algebraic extensions of matrix theory, this work provides a topological perspective that enriches the understanding of tensor structures. By establishing new theorems linking eigenvalues to topological features, the proposed framework offers deeper insights into the latent structure of data, enhancing both interpretability and robustness. Applications to data fusion illustrate the theoretical and practical significance of the approach, demonstrating its potential for broad impact across machine learning and data science domains.
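The framework itself is not spelled out above, but the algebraic object it builds on, a tensor eigenpair, is easy to exhibit. The sketch below implements the symmetric higher-order power method (S-HOPM) for a Z-eigenpair $\mathcal{T}x^2 = \lambda x$ of a symmetric third-order tensor; the rank-one test tensor and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def z_eigenpair(T, iters=200, seed=0):
    """S-HOPM sketch for a Z-eigenpair of a symmetric 3rd-order tensor T:
    iterate x <- T x^2 / ||T x^2||, then read off lambda = T x^3."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = np.einsum('ijk,j,k->i', T, x, x)  # contract T against x twice
        x = y / np.linalg.norm(y)
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)
    return lam, x

# Rank-one symmetric tensor a (x) a (x) a with unit a: eigenpair (1, a).
a = np.array([3.0, 4.0]) / 5.0
T = np.einsum('i,j,k->ijk', a, a, a)
lam, x = z_eigenpair(T)
```

For this rank-one example the iteration aligns with `a` after one step and returns $\lambda = 1$; plain S-HOPM is only guaranteed to converge for suitably convex tensors, with shifted variants used otherwise.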
Deep learning for the design of non-Hermitian topolectrical circuits
Chen, Xi, Sun, Jinyang, Wang, Xiumei, Jiang, Hengxuan, Zhu, Dandan, Zhou, Xingping
Non-Hermitian topological phases can produce remarkable properties compared with their Hermitian counterparts, such as the breakdown of conventional bulk-boundary correspondence and the emergence of non-Hermitian topological edge modes. Here, we introduce several deep learning algorithms, based on the multi-layer perceptron (MLP) and the convolutional neural network (CNN), to predict the winding of the eigenvalues of non-Hermitian Hamiltonians. Subsequently, we use the smallest module of the periodic circuit as one unit to construct high-dimensional circuit data features. Further, we use the Dense Convolutional Network (DenseNet), a type of convolutional neural network that utilizes dense connections between layers, to design a non-Hermitian topolectrical Chern circuit, as the DenseNet architecture is better suited to processing high-dimensional data. Our results demonstrate the effectiveness of the deep learning network in capturing the global topological characteristics of a non-Hermitian system based on training data.
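The invariant being learned can also be computed directly for comparison. A minimal numerical sketch (model and parameters are illustrative, not from the paper): the point-gap winding of a Bloch Hamiltonian $H(k)$ around a base energy is the accumulated phase of $\det(H(k)-E_{\mathrm{base}})$ as $k$ sweeps the Brillouin zone, shown here for a single-band Hatano-Nelson chain with asymmetric hoppings.

```python
import numpy as np

def spectral_winding(h_of_k, base=0.0, nk=400):
    """Winding of det(H(k) - base) around the origin over the Brillouin
    zone, computed by accumulating wrapped phase increments."""
    ks = np.linspace(0, 2 * np.pi, nk, endpoint=False)
    dim = h_of_k(0.0).shape[0]
    z = np.array([np.linalg.det(h_of_k(k) - base * np.eye(dim)) for k in ks])
    dphi = np.angle(np.roll(z, -1) / z)  # phase steps wrapped to (-pi, pi]
    return int(round(dphi.sum() / (2 * np.pi)))

# Hatano-Nelson chain with asymmetric hoppings t_R = 1, t_L = 0.5 (illustrative).
hn = lambda k: np.array([[1.0 * np.exp(1j * k) + 0.5 * np.exp(-1j * k)]])
print(spectral_winding(hn))  # -> 1
```

Swapping the two hopping amplitudes reverses the orientation of the spectral loop and flips the winding number to -1.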
Let's do the time-warp-attend: Learning topological invariants of dynamical systems
Moriel, Noa, Ricci, Matthew, Nitzan, Mor
Dynamical systems across the sciences, from electrical circuits to ecological networks, undergo qualitative and often catastrophic changes in behavior, called bifurcations, when their underlying parameters cross a threshold. Existing methods predict oncoming catastrophes in individual systems but are primarily time-series-based and struggle both to categorize qualitative dynamical regimes across diverse systems and to generalize to real data. To address this challenge, we propose a data-driven, physically informed deep-learning framework for classifying dynamical regimes and characterizing bifurcation boundaries based on the extraction of topologically invariant features. We focus on the paradigmatic case of the supercritical Hopf bifurcation, which is used to model periodic dynamics across a wide range of applications. Our convolutional attention method is trained with data augmentations that encourage the learning of topological invariants which can be used to detect bifurcation boundaries in unseen systems and to design models of biological systems like oscillatory gene regulatory networks. We further demonstrate our method's use in analyzing real data by recovering distinct proliferation and differentiation dynamics along the pancreatic endocrinogenesis trajectory in gene expression space based on single-cell data. Our method provides valuable insights into the qualitative, long-term behavior of a wide range of dynamical systems, and can detect bifurcations or catastrophic transitions in large-scale physical and biological systems.
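The normal form itself is easy to experiment with. A minimal forward-Euler sketch with illustrative parameters (not the paper's pipeline): in polar coordinates the supercritical Hopf normal form has radial dynamics $\dot r = \mu r - r^3$, so trajectories decay to the origin for $\mu < 0$ and settle onto a limit cycle of radius $\sqrt{\mu}$ for $\mu > 0$ — the qualitative change such a classifier is trained to detect.

```python
def asymptotic_radius(mu, r0=0.1, dt=1e-3, steps=200_000):
    """Forward-Euler integration of the radial part r' = mu*r - r^3 of the
    supercritical Hopf normal form; returns the long-time radius."""
    r = r0
    for _ in range(steps):
        r += dt * (mu * r - r ** 3)
    return r

# Below threshold the origin attracts; above it a limit cycle of radius
# sqrt(mu) appears.
print(round(asymptotic_radius(-0.5), 3))  # -> 0.0
print(round(asymptotic_radius(0.25), 3))  # -> 0.5
```

Sweeping `mu` through zero and watching the asymptotic radius switch from 0 to $\sqrt{\mu}$ traces out exactly the bifurcation boundary discussed above.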
Topological, or Non-topological? A Deep Learning Based Prediction
Rasul, Ashiqur, Hossain, Md Shafayat, Dastider, Ankan Ghosh, Roy, Himaddri, Hasan, M. Zahid, Khosru, Quazi D. M.
Prediction and discovery of new materials with desired properties are at the forefront of quantum science and technology research. A major bottleneck in this field is the computational resources and time complexity of finding new materials from ab initio calculations. In this work, an effective and robust deep learning-based model is proposed that incorporates persistent homology and a graph neural network, offering an accuracy of 91.4% and an F1 score of 88.5% in classifying topological vs. non-topological materials and outperforming other state-of-the-art classifier models. The graph neural network encodes the underlying relations between the atoms into the model based on their crystalline structures, and thus proves to be an effective method for representing and processing non-Euclidean data such as molecules with a relatively shallow network. The persistent homology pipeline in the suggested neural network integrates atom-specific topological information into the deep learning model, increasing robustness and improving performance. It is believed that the presented work will be an efficacious tool for predicting topological classes and will therefore enable the high-throughput search for novel materials in this field.
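The persistent-homology half of such a pipeline can be illustrated in miniature (this is a generic 0-dimensional persistence computation, not the paper's pipeline): in a Vietoris-Rips filtration every point is born at scale 0, and connected components die at the minimum-spanning-tree edge lengths, which single-linkage union-find recovers directly. The three-point input is an illustrative stand-in for atomic coordinates.

```python
import numpy as np

def h0_persistence(points):
    """0-dim persistence of a Vietoris-Rips filtration: all points are born
    at scale 0; components die at MST edge lengths (single linkage)."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(float(w))  # one component dies when two merge
    return deaths  # n-1 finite death times; one class persists forever

pts = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 0.0]])
print(h0_persistence(pts))  # -> [1.0, 4.0]
```

The sorted death times are exactly the kind of fixed-length, geometry-aware feature vector that can be fed alongside graph features into a classifier.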
Identifying topology of leaky photonic lattices with machine learning
Smolina, Ekaterina O., Smirnov, Lev A., Leykam, Daniel, Nori, Franco, Smirnova, Daria A.
We show how machine learning techniques can be applied for the classification of topological phases in leaky photonic lattices using limited measurement data. We propose an approach based solely on bulk intensity measurements, thus exempt from the need for complicated phase retrieval procedures. In particular, we design a fully connected neural network that accurately determines topological properties from the output intensity distribution in dimerized waveguide arrays with leaky channels, after propagation of a spatially localized initial excitation at a finite distance, in a setting that closely emulates realistic experimental conditions.
Deep Neural Networks as the Semi-classical Limit of Topological Quantum Neural Networks: The problem of generalisation
Marciano, Antonino, Chen, Deen, Fabrocini, Filippo, Fields, Chris, Lulli, Matteo, Zappala, Emanuele
Deep Neural Networks lack a principled model of their operation. A novel framework for supervised learning based on Topological Quantum Field Theory, which looks particularly well suited for implementation on quantum processors, has recently been explored. We propose the use of this framework for understanding the problem of generalization in Deep Neural Networks. More specifically, in this approach Deep Neural Networks are viewed as the semi-classical limit of Topological Quantum Neural Networks. A framework of this kind readily explains both the overfitting behavior of Deep Neural Networks during training and their corresponding generalization capabilities.
Nonparametric Topological Layers in Neural Networks
Various topological techniques and tools have been applied to neural networks in terms of network complexity, explainability, and performance. One fundamental assumption of this line of research is the existence of a global (Euclidean) coordinate system upon which the topological layer is constructed. Despite promising results, such a \textit{topologization} method has yet to be widely adopted because the parametrization of a topologization layer takes a considerable amount of time and, more importantly, lacks a theoretical foundation, without which the neural network achieves only suboptimal performance. This paper proposes a learnable topological layer for neural networks without requiring a Euclidean space; instead, the proposed construction requires nothing more than a general metric space equipped with an inner product, i.e., a Hilbert space. Accordingly, the parametrization of the proposed topological layer is free of user-specified hyperparameters, which eliminates the costly parametrization stage and the corresponding risk of suboptimal networks.
Deep Learning for Topological Invariants
Sun, Ning, Yi, Jinmin, Zhang, Pengfei, Shen, Huitao, Zhai, Hui
In this work we design and train deep neural networks to predict topological invariants for one-dimensional four-band insulators in AIII class whose topological invariant is the winding number, and two-dimensional two-band insulators in A class whose topological invariant is the Chern number. Given Hamiltonians in the momentum space as the input, neural networks can predict topological invariants for both classes with accuracy close to or higher than 90%, even for Hamiltonians whose invariants are beyond the training data set. Despite the complexity of the neural network, we find that the output of certain intermediate hidden layers resembles either the winding angle for models in AIII class or the solid angle (Berry curvature) for models in A class, indicating that neural networks essentially capture the mathematical formula of topological invariants. Our work demonstrates the ability of neural networks to predict topological invariants for complicated models with local Hamiltonians as the only input, and offers an example that even a deep neural network is understandable.
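The A-class invariant referenced above can be computed without any network, which is what makes such a comparison possible. The sketch below uses the standard Fukui-Hatsugai-Suzuki lattice method on the two-band Qi-Wu-Zhang model; the model choice, mass parameters, and grid size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def chern_number(m, N=30):
    """Chern number of the lower band of the two-band QWZ model
    H(k) = sin(kx) sx + sin(ky) sy + (m + cos kx + cos ky) sz,
    via the Fukui-Hatsugai-Suzuki lattice field-strength method."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    ks = np.linspace(0, 2 * np.pi, N, endpoint=False)
    # lower-band eigenvectors on the discretized Brillouin zone
    u = np.empty((N, N, 2), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            H = (np.sin(kx) * sx + np.sin(ky) * sy
                 + (m + np.cos(kx) + np.cos(ky)) * sz)
            _, v = np.linalg.eigh(H)
            u[i, j] = v[:, 0]
    def link(a, b):
        z = np.vdot(a, b)
        return z / abs(z)  # U(1) link variable (gauge covariant)
    F = 0.0
    for i in range(N):
        for j in range(N):
            i2, j2 = (i + 1) % N, (j + 1) % N
            plaq = (link(u[i, j], u[i2, j]) * link(u[i2, j], u[i2, j2])
                    * link(u[i2, j2], u[i, j2]) * link(u[i, j2], u[i, j]))
            F += np.angle(plaq)  # gauge-invariant Berry flux per plaquette
    return int(round(F / (2 * np.pi)))
```

For these illustrative parameters the lower band carries unit Chern number at `m = 1.0` and is trivial at `m = 3.0`; the overall sign of the nontrivial result depends on the orientation convention, so only its magnitude is quoted here. The lattice field-strength sum is an exact integer for any gapped band, even on a coarse grid.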